Elon Musk’s Self-Driving Cars Have Proven to Be Bad Drivers That Put Many Lives at Risk
Nearly 400 crashes involving automated driving technology were reported in an eleven-month period, 273 of them involving Tesla vehicles. Tesla also recalled over 500,000 cars to fix a faulty pedestrian warning sound, and the company is being investigated for two fatal crashes. While Tesla has doubled down on the claim that its product is merely a driving assistant, its “Autopilot” and “Full Self-Driving” branding can mislead consumers. Tesla is already under investigation by the National Highway Traffic Safety Administration. Uber, for its part, has already faced charges over a fatal crash in which its autonomous vehicle identified a bicycle rider too late. The report noted that a human safety driver in the vehicle was not paying attention, but that does not excuse the company’s liability. When a product is marketed as “automated” or anything that hints at self-driving, the common assumption is that the person behind the wheel doesn’t need to do anything, because responsibility has been outsourced to the robot driving the car. The average person will believe they can even take a nap and let the AI take the wheel.
At its current stage, AI is nowhere near the level it needs to reach to be classified as a safe “autonomous vehicle.” More testing is needed, and the technology must demonstrate that it can handle the chaotic environment created by human drivers. People wave each other through intersections, flash their headlights, and make other unpredictable gestures that AI can miss or misinterpret. But alas, Elon Musk does not want to admit that the technology is nowhere near where it needs to be. Doing so would crash his company, though that may happen anyway under the weight of massive lawsuits.
Billionaire Dan O’Dowd’s Dawn Project has demonstrated the Tesla self-driving system repeatedly hitting a child-size mannequin in its tests.